# Llama 2 architecture

All three models below are published by nickypro under the MIT license and tagged Large Language Model / Transformers.

| Model | Downloads | Likes | Description |
|---|---|---|---|
| Tinyllama 42M Fp32 | 517 | 3 | A 42M-parameter Llama 2 architecture model in float32 precision, trained on the TinyStories dataset; suitable for simple text generation tasks. |
| Tinyllama 110M | 1,472 | 5 | A 110M-parameter Llama 2 architecture model trained on the TinyStories dataset; suitable for lightweight text generation tasks. |
| Tinyllama 15M | 3,217 | 11 | A 15M-parameter Llama 2 architecture model trained on the TinyStories dataset. |
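The 15M/42M/110M figures follow directly from the Llama 2 architecture hyperparameters. Here is a minimal sketch of the count, assuming the llama2.c-style TinyStories configuration for the 15M model (dim 288, 6 layers, FFN hidden size 768, vocab 32,000, tied embeddings); these dimensions are assumptions and are not stated on this page:

```python
def llama2_param_count(dim: int, n_layers: int, hidden: int, vocab: int,
                       tied_embeddings: bool = True) -> int:
    """Rough Llama 2 parameter count (rotary embeddings add no parameters)."""
    embed = vocab * dim                    # token embedding table
    if not tied_embeddings:
        embed += vocab * dim               # separate output head
    attn = 4 * dim * dim                   # Wq, Wk, Wv, Wo (assumes n_kv_heads == n_heads)
    ffn = 3 * dim * hidden                 # SwiGLU MLP: gate, up, down projections
    norms = 2 * dim                        # two RMSNorm weight vectors per layer
    final_norm = dim
    return embed + n_layers * (attn + ffn + norms) + final_norm

# Assumed config for the 15M TinyStories model
n = llama2_param_count(dim=288, n_layers=6, hidden=768, vocab=32000)
print(f"{n / 1e6:.1f}M parameters")  # → 15.2M parameters
```

Plugging in the (likewise assumed) larger configurations, dim 512 / hidden 1376 / 8 layers and dim 768 / hidden 2048 / 12 layers, lands near 42M and 110M respectively.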
© 2025 AIbase